Microsoft ignored safety problems with AI image generator, engineer complains

The Guardian

An artificial intelligence engineer at Microsoft published a letter Wednesday alleging that the company's AI image generator lacks basic safeguards against creating violent and sexualized images. In the letter, engineer Shane Jones states that his repeated attempts to warn Microsoft management about the problems failed to result in any action. Jones said he sent the message to the Federal Trade Commission and Microsoft's board of directors. "Internally the company is well aware of systemic issues where the product is creating harmful images that could be offensive and inappropriate for consumers," Jones states in the letter, which he published on LinkedIn. He lists his title as "principal software engineering manager".


Microsoft engineer who raised concerns about Copilot image creator pens letter to the FTC

Engadget

Microsoft engineer Shane Jones raised concerns about the safety of OpenAI's DALL-E 3 back in January, suggesting the product has security vulnerabilities that make it easy to create violent or sexually explicit images. He also alleged that Microsoft's legal team blocked his attempts to alert the public to the issue. Now he has taken his complaint directly to the FTC, as reported by CNBC. "I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place," Jones wrote in a letter to FTC Chair Lina Khan. He noted that Microsoft "refused that recommendation," so he is now asking the company to add disclosures to the product to alert consumers to the alleged danger. Jones also wants the company to change the app's rating to restrict it to adult audiences.